The Vanishing/Exploding Gradient Problem in Deep Neural Networks

Understanding the obstacles we face when building deep neural networks

Kurtis Pykes
Towards Data Science
4 min read · May 17, 2020


A difficulty we face when training deep neural networks is that of vanishing or exploding gradients. For a long time, this obstacle was a major barrier to training large networks. By the end of this article, you will not only know what the vanishing and exploding gradient problem is, but also how to detect when you are facing it and some…
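As a quick illustration of the vanishing side of the problem, here is a minimal NumPy sketch (my own example, not from the article): we backpropagate through a deep stack of sigmoid layers and watch the gradient norm shrink as it travels toward the input. All names and parameter choices below (layer count, width, weight scale) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative sizes: 50 sigmoid layers, 64 units each.
n_layers, width = 50, 64
# Small random weights; sigmoid'(z) <= 0.25, so each backward step
# multiplies the gradient by a factor well below 1.
weights = [rng.normal(0.0, 0.1, (width, width)) for _ in range(n_layers)]

# Forward pass, caching pre-activations for the backward pass.
a = rng.normal(0.0, 1.0, width)
pre_acts = []
for W in weights:
    z = W @ a
    pre_acts.append(z)
    a = sigmoid(z)

# Backward pass: track the gradient norm at each layer.
grad = np.ones(width)
norms = []
for W, z in zip(reversed(weights), reversed(pre_acts)):
    s = sigmoid(z)
    grad = W.T @ (grad * s * (1.0 - s))   # chain rule through sigmoid(Wa)
    norms.append(np.linalg.norm(grad))

print(f"gradient norm near the output layer: {norms[0]:.3e}")
print(f"gradient norm near the input layer:  {norms[-1]:.3e}")
```

The norm collapses by many orders of magnitude between the output and input layers, which is exactly the symptom you would look for when monitoring gradients during training; with large weight scales, the same loop exhibits the opposite, exploding, trend.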



I ghostwrite educational email courses for entrepreneurs in AI. Self-taught machine learning practitioner, writing about AI since 2020.